Local-global nested graph kernels using nested complexity traces
Authors
Abstract
Similar articles
Optimizing Nested Loops Using Local CPS Conversion
Local CPS conversion is a compiler transformation for improving the code generated for nested loops by a direct-style compiler that uses recursive functions to represent loops. The transformation selectively applies CPS conversion at non-tail call sites, which allows the compiler to use a single machine procedure and stack frame for both the caller and callee. In this paper, we describe LCPS co...
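The snippet above describes a compiler transformation; as a rough, language-level illustration only (Python itself performs no tail-call elimination, and the function names below are invented for the example), this sketch contrasts a direct-style non-tail call inside a recursive loop with its continuation-passing form, which is the shape that CPS conversion produces so that caller and callee can share a single frame.

# Direct style: the outer recursive loop makes a non-tail call to
# inner_sum, so each iteration needs an extra frame for that call.
def inner_sum(row):
    def go(i, acc):
        if i == len(row):
            return acc
        return go(i + 1, acc + row[i])       # tail call
    return go(0, 0)

def outer_direct(matrix):
    def go(j, acc):
        if j == len(matrix):
            return acc
        s = inner_sum(matrix[j])             # non-tail call
        return go(j + 1, acc + s)
    return go(0, 0)

# CPS form: inner_sum_k receives a continuation, so every call becomes a
# tail call; a compiler with proper tail calls could then run the nested
# loops in a single machine procedure and stack frame.
def inner_sum_k(row, k):
    def go(i, acc):
        if i == len(row):
            return k(acc)                    # jump to the continuation
        return go(i + 1, acc + row[i])
    return go(0, 0)

def outer_cps(matrix):
    def go(j, acc):
        if j == len(matrix):
            return acc
        return inner_sum_k(matrix[j], lambda s: go(j + 1, acc + s))
    return go(0, 0)

assert outer_direct([[1, 2], [3, 4]]) == outer_cps([[1, 2], [3, 4]]) == 10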
Complexity of Nested Circumscription and Nested Abnormality Theories
Circumscription has been recognized as an important principle for knowledge representation and common-sense reasoning. The need for a circumscriptive formalism that allows for simple yet elegant modular problem representation has led Lifschitz (AIJ, 1995) to introduce nested abnormality theories (NATs) as a tool for modular knowledge representation, tailored for applying circumscription to mini...
Kernels on Structured Objects Through Nested Histograms
We propose a family of kernels for structured objects which is based on the bag-of-components paradigm. Rather than decomposing each complex object into the single histogram of its components, we use for each object a family of nested histograms, where each histogram in this hierarchy describes the object seen from an increasingly granular perspective. We use this hierarchy of histograms to defi...
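A minimal sketch of the nested-histogram idea described above (the histogram-intersection base kernel, the specific bin counts, and the per-level weights are assumptions made for illustration, not details taken from the paper):

import numpy as np

def histogram_intersection(h1, h2):
    # Base kernel between two normalized histograms.
    return np.minimum(h1, h2).sum()

def nested_histograms(values, levels=(4, 8, 16), lo=0.0, hi=1.0):
    # Describe a bag of scalar components at increasingly granular
    # resolutions: one histogram per level, coarse to fine.
    return [np.histogram(values, bins=b, range=(lo, hi))[0] / len(values)
            for b in levels]

def nested_kernel(x, y, weights=(1.0, 0.5, 0.25)):
    # Weighted sum of per-level base kernels over the nested histograms.
    hx, hy = nested_histograms(x), nested_histograms(y)
    return sum(w * histogram_intersection(a, b)
               for w, a, b in zip(weights, hx, hy))

rng = np.random.default_rng(0)
print(nested_kernel(rng.random(100), rng.random(100)))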
Global graph kernels using geometric embeddings
Applications of machine learning methods increasingly deal with graph structured data through kernels. Most existing graph kernels compare graphs in terms of features defined on small subgraphs such as walks, paths or graphlets, adopting an inherently local perspective. However, several interesting properties such as girth or chromatic number are global properties of the graph, and are not capt...
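For contrast with the global perspective advocated above, here is a minimal sketch of one of the "local", walk-based kernels the abstract refers to: a geometric random-walk kernel for unlabeled graphs (the decay value and the uniform start/stop weighting are assumptions chosen for the example).

import numpy as np

def random_walk_kernel(A1, A2, lam=0.01):
    # Geometric random-walk kernel for two unlabeled graphs given as
    # adjacency matrices.  Common walks of the two graphs are walks in the
    # direct (Kronecker) product graph, and summing lam**k over all walk
    # lengths k gives (I - lam * Ax)^-1 when lam is small enough.
    Ax = np.kron(A1, A2)                      # product-graph adjacency
    W = np.linalg.inv(np.eye(Ax.shape[0]) - lam * Ax)
    return W.sum()                            # uniform start/stop weights

# Toy comparison: a triangle versus a path on three vertices.
triangle = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
path3    = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
print(random_walk_kernel(triangle, path3))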
Learning Kernels Using Local Rademacher Complexity
We use the notion of local Rademacher complexity to design new algorithms for learning kernels. Our algorithms thereby benefit from the sharper learning bounds based on that notion which, under certain general conditions, guarantee a faster convergence rate. We devise two new learning kernel algorithms: one based on a convex optimization problem for which we give an efficient solution using exi...
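The truncated snippet does not spell out the two algorithms, but the generic learning-kernels setup it builds on is a nonnegative combination of fixed base kernels; the sketch below only illustrates that parametrization (the base kernels and weights are arbitrary examples), not the paper's optimization procedure.

import numpy as np

def combined_kernel(kernel_mats, mu):
    # Learned kernel as a nonnegative combination of base kernel matrices,
    # K_mu = sum_m mu[m] * K_m; the weights mu are what a learning-kernels
    # algorithm would optimize, e.g. under a simplex or L2-ball constraint.
    mu = np.asarray(mu, dtype=float)
    assert np.all(mu >= 0), "kernel weights must be nonnegative"
    return sum(w * K for w, K in zip(mu, kernel_mats))

# Arbitrary example: combine a linear and an RBF kernel on the same points.
X = np.random.default_rng(0).random((5, 3))
K_lin = X @ X.T
K_rbf = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1))
print(combined_kernel([K_lin, K_rbf], mu=[0.7, 0.3]))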
Journal
Journal title: Pattern Recognition Letters
Year: 2020
ISSN: 0167-8655
DOI: 10.1016/j.patrec.2018.06.016